Patent abstract:
Method for automatically gripping, by a polyarticulated system (3) coupled to a vision system (9), an object (2) located in an area (4) capable of receiving at least one object (2), said polyarticulated system (3) comprising at least one gripping member (5) adapted to grip an object (2) by a specific area of said object (2). According to the invention, the method comprises at least the steps of: - capturing an image of the reception area (4) by means of the vision system (9); - processing the information resulting from the 3D image and identifying all the specific zones that the objects (2) to be grasped may comprise and that are compatible with the one or more gripping members (5); - locating, in position and orientation, the identified compatible specific zone or zones; - selecting one of the localized compatible specific zones and automatically defining, for the corresponding gripping member (5), a gripping trajectory of the corresponding object (2) by the selected compatible specific zone; - grasping the corresponding object (2) according to the defined trajectory.
Publication number: FR3020303A1
Application number: FR1453725
Filing date: 2014-04-25
Publication date: 2015-10-30
Inventors: Herve Emmanuel Henry;Florian Sella;Romain Bregier
Applicant: SILEANE;
Main IPC class:
Patent description:

[0001] METHOD AND INSTALLATION FOR AUTOMATICALLY GRIPPING AN OBJECT. TECHNICAL FIELD The present invention relates to the technical sector of the manipulation of objects by a polyarticulated system. By the term "polyarticulated system" is meant, for example, a six-axis robotic system. The invention more particularly relates to a method of automatically gripping an object by a polyarticulated system coupled to a vision system. The invention finds an advantageous application when it is appropriate, for example, to sort randomly oriented objects conveyed one after the other, or to extract objects one by one from a bulk volume. By the term "bulk volume" is meant a cluster of objects, identical or different, randomly oriented in a volume. In general, the invention relates to all applications in which it is necessary to grasp an object via a polyarticulated system. PRIOR ART It is known from the state of the art, and in particular from patent application FR 2 987 685, a method of controlling a robot for moving at least one object disposed on a support in a 3D space. This method notably comprises the steps of: acquisition of at least two 2D images of the 3D space; segmentation of each previously acquired 2D image, intended to extract from each 2D image at least one 2D geometric primitive; searching for correspondences between the 2D geometric primitives extracted from distinct 2D images; calculating the coordinates of a 3D geometric primitive contained in the 3D space for each group of corresponding 2D geometric primitives; associating each 3D geometric primitive with a known object; searching for the position and orientation in the 3D space of each known object with which at least one 3D geometric primitive has been associated; moving, via a robot, at least one of the known objects. Thus, this method offers the possibility of detecting, locating, and orienting objects in a 3D space.
[0002] However, this method has the disadvantage that it does not adapt to any type of object to be moved. Indeed, this method only works with objects previously known and identified by learning software. This method cannot be applied when it is appropriate, for example, to isolate one by one unknown objects, of different natures, from a bulk volume. Such an application is, for example, a sorting operation on a cluster of objects in a dump. SUMMARY OF THE INVENTION The invention aims to overcome the aforementioned drawbacks, and thus to provide a method of automatically gripping an unknown object, without prior learning of the object, and without prior learning of the gripping trajectory of said object. The invention also aims to provide a method for, for example, sorting randomly oriented objects conveyed one after the other, or extracting objects one by one from a bulk volume. For this purpose, a method has been developed for the automatic gripping, by a polyarticulated system coupled to a vision system, of an unknown object located in an area adapted to receive at least one object. The polyarticulated system comprises at least one gripping member adapted to grip an object by at least one specific area of said object, and the method is remarkable in that it comprises at least the steps of: - capturing an image of the reception area by means of the vision system; - processing the information resulting from the image and identifying all the specific zones that the objects to be grasped may comprise and that are compatible with the one or more gripping members; - locating, in position and orientation, the identified compatible specific zone or zones; - selecting one of the localized compatible specific zones and automatically defining, for the corresponding gripping member, a gripping trajectory of the corresponding object by the selected compatible specific zone; - grasping the corresponding object according to the defined trajectory.
Thus, the automatic gripping method does not require any learning of the object to be grasped. The object may be arbitrary; it is sufficient that a specific area of the object be detected and that this area be compatible with the gripping member of the polyarticulated system for the object to be grasped by this specific area. The grasped object can then be processed in any desired manner. It can, for example, if the method is integrated in a sorting operation, be arranged in a specific receiving tray. The invention relates to all applications in which it is necessary to grasp an object via a polyarticulated system. To sort randomly oriented objects conveyed one after the other, or to extract objects one by one from a bulk volume, it suffices to repeat the process as many times as necessary. After each grasp, the process resumes with the first step of capturing an image of the receiving area. In this way, the method adapts in real time to the reception area. In other words, an object to be grasped which has been moved by the grasping of a previous object will still be located and grasped. The method is also advantageous in that it does not seek to recognize an object, but to recognize specific gripping areas on the objects to be grasped. In this way, objects of different sizes, shapes or materials can be grasped by the same gripping member and without requiring learning of the object. It suffices that the objects to be gripped have specific zones compatible with said gripping member. Since the compatible specific area is located in position and orientation, the method does not require any learning of the gripping trajectory of the corresponding object. The gripping trajectory is calculated in real time, and for each object to be grasped. The method then makes it possible to perform very fast pick-and-place operations on unknown objects.
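The capture-identify-locate-select-grasp cycle described above can be sketched in code. This is an illustrative toy model, not the patented implementation: the `Zone` class, the mutable `scene` list standing in for the captured image, and all function names are assumptions made for the example.

```python
# Toy sketch of the repeated grasp cycle; all names are illustrative
# assumptions, not part of the patent.
from dataclasses import dataclass

@dataclass
class Zone:
    x: float
    y: float
    z: float            # altitude of the zone in the reception area
    gripper: str        # name of the compatible gripping member

def pick_all(scene, select):
    """Repeat the grasp cycle until no compatible zone remains.

    `scene` is a mutable list of Zone objects standing in for what the
    vision system sees; re-reading it on each iteration mimics the fresh
    image captured after every grasp, so displaced objects are re-located.
    """
    picked = []
    while scene:                      # capture image + identify zones
        zone = select(scene)          # locate in pose + select one zone
        scene.remove(zone)            # grasp along the defined trajectory
        picked.append(zone)
    return picked

# Example: always grasp the zone highest in altitude first.
scene = [Zone(0.0, 0.0, 0.10, "clamp"), Zone(1.0, 0.0, 0.30, "suction")]
order = pick_all(scene, select=lambda zs: max(zs, key=lambda z: z.z))
```

Because the candidate list is re-read on every iteration, a zone displaced by a previous grasp would simply appear with updated coordinates the next time around, mirroring the re-capture step of the method.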
[0003] Advantageously, the method is intelligent and the step of choosing one of the localized compatible specific zones consists in weighting each identified zone with a coefficient which is a function of the probabilities of success of the grasping by the corresponding gripping member, and in choosing the zone with the highest probability of successful gripping. The weighting coefficient may be a function of any type of parameter, such as, for example, the distance separating the localized compatible specific area from the corresponding gripping member. The weighting coefficient can also be calculated according to the orientation of the area to be grasped. The probabilities of success are a function of the gripping member used. In order to increase the speed of the gripping process, the step of defining a gripping trajectory for the gripping member consists in defining the shortest and fastest trajectory. The reception zone defined in the method can be of any type. This zone may, for example, be constituted by a zone on a conveyor, which conveys the objects to be grasped one by one past the polyarticulated system. This zone may also be fixed and be constituted by a bulk volume of objects to be extracted one by one. When the object to be grasped is in a bulk consisting of a plurality of objects, the method may advantageously include a step of identifying the objects located highest in altitude. The choice of the localized specific area can then be made among the areas located on the object identified as being the highest. In this way, the method avoids attempts to grasp an object partially covered by another, whose chances of a successful grasp are reduced. Advantageously, the method comprises a step consisting, on the one hand, in verifying whether the object has been gripped by a gripping member and, on the other hand, in choosing another localized compatible specific zone in the case of a determined number of unsuccessful gripping attempts.
[0004] In this way, the method cannot remain stuck on an object that a gripping member cannot grasp. After a determined number of unsuccessful gripping attempts, the process automatically switches to another area to be gripped, on the same or on another object, and with the same or another gripping member.
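The verify-and-switch behaviour of paragraph [0004] can be sketched as follows. This is a minimal sketch under assumed names: `try_grasp` stands in for one physical gripping attempt followed by the verification step, and the attempt limit is an invented parameter.

```python
# Hypothetical sketch of the retry logic: after a determined number of
# unsuccessful attempts on one zone, move on to another candidate zone.
def grasp_with_retries(zones, try_grasp, max_attempts=3):
    """Try each candidate zone up to `max_attempts` times.

    Returns the zone that was successfully grasped, or None when every
    candidate has exhausted its attempts.
    """
    for zone in zones:
        for _ in range(max_attempts):
            if try_grasp(zone):    # verify whether the object was gripped
                return zone
        # determined number of failures reached: switch to another zone
    return None

# Example: attempts on "zone_a" always fail, "zone_b" succeeds at once.
result = grasp_with_retries(["zone_a", "zone_b"],
                            try_grasp=lambda z: z == "zone_b")
```

In a real installation the candidate list would mix zones on the same or other objects and different gripping members, as the paragraph describes; here a flat list keeps the example short.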
[0005] According to a particular embodiment, the step of the method consisting in identifying the compatible specific areas of the objects to be grasped consists in identifying continuous zones in the case where the gripping member is of the suction cup type.
[0006] By the term "suction cup" is meant both a suction cup operating by air suction and a magnetic suction cup, such as a magnet, for example. It is thus obvious that, to be compatible with the suction cup, the identified zone must comprise a continuous region with a minimum surface area corresponding to the active surface of the suction cup.
[0007] According to another particular embodiment, the step of the method consisting in identifying the compatible specific areas of the objects to be grasped consists in identifying edges, or generatrices, preferably parallel, in the case where the gripping member is of the clamp type. In this configuration, it is obvious that, to be compatible with the clamp, the maximum distance between said edges or generatrices must be less than the maximum possible spacing between the jaws of said clamp. The invention also relates to an installation for automatically gripping an object. The installation comprises a polyarticulated system coupled to a vision system, and an area adapted to receive at least one object to be grasped. The polyarticulated system comprises at least one gripping member adapted to grip an object by a specific area of said object. The vision system is able to capture an image of the reception area. According to the invention, the vision system and the polyarticulated system are connected to processing and calculation means capable of processing the information resulting from the captured image, of identifying all the specific zones that the objects to be grasped may comprise and that are compatible with the one or more gripping members, of locating, in position and orientation, the identified compatible specific zone or zones, of selecting one of the localized compatible specific zones, and of automatically defining, for the corresponding gripping member, a gripping trajectory of the corresponding object by the selected compatible specific zone. This installation is thus able to implement the method according to the invention and therefore has all the aforementioned advantages of said method. According to different embodiments, the gripping member of the polyarticulated system may comprise at least one clamp and/or at least one suction cup.
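The clamp-compatibility rule can likewise be sketched as a filter over detected edge pairs. This is an assumption-laden illustration: the `(label, spacing)` tuples stand in for whatever the image processing actually produces, and the numeric values are invented.

```python
# Hypothetical filter keeping, among detected edge (or generatrix) pairs,
# those whose spacing is below the clamp's maximum possible jaw opening.
def compatible_edge_pairs(edge_pairs_mm, max_jaw_gap_mm):
    """edge_pairs_mm: (label, spacing) tuples from image processing."""
    return [label for label, spacing in edge_pairs_mm
            if spacing < max_jaw_gap_mm]

pairs = [("bolt", 12.0), ("plank", 95.0), ("tube", 30.0)]
graspable = compatible_edge_pairs(pairs, max_jaw_gap_mm=40.0)
```

A full system would also check that the two edges are close enough to parallel; that geometric test is omitted here for brevity.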
Advantageously, the one or more gripping members comprise elastic members arranged at the portion of said gripping members which is intended to come into contact with the object to be gripped. These elastic members have a flexible behavior to adapt to the nature of the object to be grasped. Thus a fragile object can be grasped without being broken. BRIEF DESCRIPTION OF THE DRAWINGS Other characteristics and advantages of the invention will emerge clearly from the description which is given hereinafter, by way of indication and in no way limiting, with reference to the appended figures, in which: FIG. 1 is a schematic representation in perspective of an installation for automatically gripping an object according to the invention; FIG. 2 is a schematic representation in perspective of the gripping members of the polyarticulated system; FIG. 3 is a schematic representation in perspective of a gripping member of the clamp type of the polyarticulated system; FIG. 4 is a schematic representation in perspective of a gripping member of the suction cup type of the polyarticulated system. DETAILED DESCRIPTION OF THE INVENTION With reference to FIG. 1, an installation (1) for automatically gripping an object (2) according to the invention comprises a polyarticulated system (3) and a receiving zone (4). The polyarticulated system (3) is, for example, a six-axis robot well known in the state of the art. The polyarticulated system (3) is capable of grasping an object (2) by a specific zone of said object (2). For this purpose, said polyarticulated system (3) comprises three gripping members (5), and is coupled to a vision system (9). The objects (2) to be grasped are arranged randomly in bulk in the receiving zone (4). The objects (2) are, for example, of different sizes, shapes, and natures. The illustrated reception zone (4) is constituted by a bulk volume of objects (2) to be extracted one by one. With reference to FIG. 2, the gripping members (5) can be of any suitable type.
The essential point lies in the fact that they are able to grasp an object (2) by a specific area of said object (2). In the illustrated embodiment, the gripping members (5) are in the form of a clamp (6), a first suction cup (7a), and a second suction cup (7b). With reference to FIG. 3, the clamp (6) comprises two jaws (6a) able to move away from each other and to grasp an object (2). The specific areas of an object (2) that can be grasped by said clamp (6) are edges or generatrices, preferably parallel. Obviously, the maximum distance between said edges or generatrices must be less than the maximum possible gap between the jaws (6a) of said clamp (6). The jaws (6a) advantageously comprise, on their parts intended to be in contact with the object (2) to be grasped, an elastic member (8) having a flexible behavior to adapt to the nature of said object (2) to be grasped. With reference to FIG. 4, a suction cup (7) is, in a manner well known in the art, connected to suction means. This suction cup (7) comprises at its end an elastic member (8) in the form of a sleeve adapted to be applied to a specific area of an object (2) for grasping said object (2) by suction. The elastic sleeve (8) is intended to make contact with the object (2) to be grasped. This elastic sleeve (8) has a first level of compliance, determined in particular to adapt to a certain type of fragile object (2). This sleeve (8) makes it possible to grasp fragile objects (2) without breaking them. The specific areas of an object (2) that can be grasped by a suction cup (7) are continuous regions, namely substantially planar surfaces of dimensions adapted to receive in abutment said elastic sleeve (8) of the suction cup (7). The first and second suction cups (7a, 7b) are similar except for their elastic sleeves (8), which have different levels of compliance. The second suction cup (7b) is, for example, larger than the first (7a), so as to be adapted to heavier and less fragile objects (2).
The vision system (9), preferably arranged in line with the reception zone (4), is able to capture an image of the reception zone (4), and can be of any suitable type, such as, for example, a 3D camera or a camera in stereo mode. In addition, the installation (1) comprises processing and calculation means connected to the polyarticulated system (3) and to the vision system (9). These processing and calculation means are of any suitable type, and may, for example, be in the form of a microprocessor. These processing and calculation means are capable of processing the information resulting from the captured image so as to identify, on the objects (2) to be grasped, edges or generatrices, preferably parallel, whose spacing is less than the maximum gap between the jaws (6a) of the clamp (6). They are also able to identify, on the objects (2) to be grasped, flat surfaces whose dimensions are adapted to receive in support the elastic sleeves (8) of the first or second suction cups (7a, 7b). Said processing and calculation means also make it possible to locate, in position and orientation, said identified compatible specific zones, and to choose one of the localized compatible specific zones. The choice of the zone to be grasped consists, for example, in weighting each identified zone with a coefficient which is a function of the probabilities of success of the gripping by the corresponding gripping member (5), and in choosing the zone having the highest probability of successful gripping. In practice, this amounts to assigning weighting coefficients to each identified and compatible zone, according to one or more parameters. The parameters used can be of any kind, such as, for example, the altitude of the zone to be grasped, the area of its visible surface, or the inclination of the gripping zone. Each zone to be gripped therefore has a weighting for each of the chosen parameters. The weightings of each zone are then averaged to obtain a score.
The zone with the highest score, or the lowest depending on the nature of the weighting, is chosen to be gripped.
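The scoring procedure just described can be sketched directly: one weighting per parameter and per zone, averaged into a score, with the best-scoring zone selected. The parameter names and numeric weightings below are invented for the example; the patent does not prescribe any particular scale.

```python
# Illustrative sketch of the zone-scoring step, under assumed data shapes.
def score(weights):
    """Average of the per-parameter weightings of one zone."""
    return sum(weights.values()) / len(weights)

def choose_zone(candidates, highest_is_best=True):
    """candidates: zone id -> {parameter: weighting}. Returns the id of
    the zone whose averaged score is best; `highest_is_best` captures the
    'or the lowest depending on the nature of the weighting' case."""
    pick = max if highest_is_best else min
    return pick(candidates, key=lambda z: score(candidates[z]))

candidates = {
    "zone_a": {"altitude": 0.9, "visible_area": 0.4, "inclination": 0.7},
    "zone_b": {"altitude": 0.6, "visible_area": 0.8, "inclination": 0.9},
}
best = choose_zone(candidates)
```

A plain average treats every parameter as equally important; a weighted average would let the installation favour, say, altitude over inclination without changing the structure of the selection.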
[0008] Once the specific zone has been chosen, for example a flat surface that can be gripped by one of the suction cups (7a, 7b), or two parallel edges that can be grasped by the clamp (6), said processing and calculation means are adapted to automatically define, for the corresponding gripping member, a gripping trajectory of the corresponding object (2) by said selected compatible specific zone.
[0009] The installation (1) according to the invention therefore makes it possible to implement a method of automatically gripping an unknown object (2). According to the invention, the method comprises the steps of: - capturing an image of the reception area (4) by means of the vision system (9); - processing the information resulting from the image and identifying all the specific zones that the objects (2) to be grasped may comprise and that are compatible with the one or more gripping members (5); - locating, in position and orientation, the identified compatible specific zone or zones; - selecting one of the localized compatible specific zones and automatically defining, for the corresponding gripping member (5), a gripping trajectory of the corresponding object (2) by the selected compatible specific zone; - grasping the corresponding object (2) according to the defined trajectory. This method is advantageous in that it makes it possible to grasp any object (2), without prior learning of the object (2) or of the trajectories of said object (2). According to the invention, one of the gripping members (5) can be favored over the others. Indeed, an order of preference for use of the gripping members (5) can be established.
[0010] For this purpose, if the clamp (6) is favored first of all, then during the processing of the image captured by the vision system (9), one of the localized specific zones is chosen only from among those that can be gripped with said clamp (6).
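The preference order of paragraphs [0009]-[0010] amounts to restricting selection to the zones of the most preferred gripping member that has at least one compatible zone. A minimal sketch, with invented zone identifiers and gripper names:

```python
# Hypothetical sketch of favouring one gripping member over the others.
def zones_for_preferred_gripper(zones, preference):
    """zones: (zone_id, gripper_name) tuples; preference: gripper names
    in decreasing order of preference. Falls back to the next gripper
    only when the preferred one has no compatible zone."""
    for gripper in preference:
        selected = [z for z, g in zones if g == gripper]
        if selected:
            return gripper, selected
    return None, []

zones = [("z1", "suction"), ("z2", "clamp"), ("z3", "suction")]
gripper, selected = zones_for_preferred_gripper(zones, ["clamp", "suction"])
```

The scoring step described earlier would then run only on the restricted list, so the weighting never overrides the declared order of preference.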
[0011] The method according to the invention is advantageous because it makes it possible to carry out the picking of unknown objects (2) arranged in bulk, automatically and rapidly, without requiring the intervention of an operator, or prior learning of the object (2) or of the trajectory of the object (2).
[0012] It is obvious that the method and the installation (1) according to the invention can implement a plurality of polyarticulated systems (3), each of which may be coupled to the same vision system (9), or each to its own vision system (9). Thus, while one of the polyarticulated systems (3) moves a grasped object (2), another polyarticulated system (3) can, in time masked by the displacement of said grasped object (2), grasp another one, and so on. This makes it possible to perform picking and removal operations on unknown objects (2) in an optimal and fast manner. After being grasped, the object (2) can be processed in any desired manner, for example routed to a treatment station or deposited in a specific receiving tray.
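The "masked time" gain of running two polyarticulated systems can be illustrated with a deliberately simplified timing model (the durations and the overlap assumption are inventions for the example, not figures from the patent):

```python
# Toy timing model: with two robots alternating, each grasp after the
# first happens while the other robot is still placing, so only the
# longer of the two phases dominates per object.
def time_sequential(n_objects, t_grasp, t_place):
    """One robot: each object costs a full grasp + place cycle."""
    return n_objects * (t_grasp + t_place)

def time_two_robots(n_objects, t_grasp, t_place):
    """Two robots alternating, ideal overlap and no contention."""
    return t_grasp + t_place + (n_objects - 1) * max(t_grasp, t_place)
```

Real installations would see less than this ideal speed-up, since the two arms share the reception zone and may have to wait for each other, but the model shows why the hidden-time scheduling pays off.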
Claims:
Claims (12)
[0001]
1. Method for automatically gripping, by a polyarticulated system (3) coupled to a vision system (9), an object (2) located in a zone (4) capable of receiving at least one object (2), said polyarticulated system (3) comprising at least one gripping member (5) capable of grasping an object (2) by at least one specific zone of said object (2), characterized in that it comprises at least the steps of: - capturing an image of the reception area (4) by means of the vision system (9); - processing the information resulting from the image and identifying all the specific zones that the objects (2) to be grasped may comprise and that are compatible with the one or more gripping members (5); - locating, in position and orientation, the identified compatible specific zone or zones; - selecting one of the localized compatible specific zones and automatically defining, for the corresponding gripping member (5), a gripping trajectory of the corresponding object (2) by the selected compatible specific zone; - grasping the corresponding object (2) according to the defined trajectory.
[0002]
2. Method for automatically gripping an object according to claim 1, characterized in that the step of choosing one of the localized compatible specific zones consists in weighting each identified zone with a coefficient which is a function of the probabilities of success of the grasping by the corresponding gripping member (5), and in choosing the zone with the highest probability of successful gripping.
[0003]
3. Method for automatically gripping an object (2) according to any one of the preceding claims, characterized in that the step of defining a gripping trajectory consists in defining the shortest and fastest trajectory.
[0004]
4. Method for automatically gripping an object (2) according to any one of the preceding claims, characterized in that the object (2) to be grasped is in a bulk consisting of a plurality of different objects (2) to be grasped.
[0005]
5. Method for automatically gripping an object (2) according to claim 4, characterized in that it comprises a step of identifying the objects (2) located highest in altitude, and in that the choice of the localized specific zone is made among the localized zones on the object (2) identified as being the highest.
[0006]
6. Method for automatically gripping an object (2) according to any one of the preceding claims, characterized in that it comprises a step consisting, on the one hand, in checking whether the object (2) has been grasped by a gripping member (5) and, on the other hand, in selecting another localized compatible specific zone in the case of a determined number of unsuccessful gripping attempts.
[0007]
7. Method for automatically gripping an object (2) according to any one of the preceding claims, characterized in that the step of identifying the compatible specific zones of the objects (2) to be grasped consists in identifying continuous zones in the case where the gripping member (5) is of the suction cup type (7).
[0008]
8. Method for automatically gripping an object (2) according to any one of the preceding claims, characterized in that the step of identifying the compatible specific zones of the objects (2) to be grasped consists in identifying edges, or generatrices, in the case where the gripping member (5) is of the clamp type (6), the maximum distance between said edges or generatrices having to be less than the maximum possible gap between the jaws (6a) of said clamp (6).
[0009]
9. Installation (1) for automatically gripping an object (2), said installation (1) comprising a polyarticulated system (3) coupled to a vision system (9), and a zone (4) adapted to receive at least one object (2) to be grasped, said polyarticulated system (3) comprising at least one gripping member (5) able to grip an object (2) by a specific area of said object (2), said vision system (9) being adapted to capture an image of the reception zone (4), characterized in that said vision system (9) and said polyarticulated system (3) are connected to processing and calculation means adapted to process the information resulting from the captured image, to identify all the specific zones that the objects (2) to be grasped may comprise and that are compatible with the one or more gripping members (5), to locate, in position and orientation, the identified compatible specific zone or zones, to choose one of the localized compatible specific zones, and to automatically define, for the corresponding gripping member (5), a gripping trajectory of the corresponding object (2) by the selected compatible specific zone.
[0010]
10. Installation (1) for automatically gripping an object (2) according to claim 9, characterized in that the gripping member (5) of the polyarticulated system (3) is in the form of at least one clamp (6).
[0011]
11. Installation (1) for automatically gripping an object (2) according to any one of claims 9 and 10, characterized in that the gripping member (5) of the polyarticulated system (3) is in the form of at least one suction cup (7).
[0012]
12. Installation (1) for automatically gripping an object (2) according to any one of claims 9 to 11, characterized in that the gripping member or members (5) comprise elastic members (8) arranged at the level of the part of said gripping members (5) which is intended to come into contact with the object (2) to be grasped, said elastic members (8) having a flexible behavior to adapt to the nature of the object (2) to be grasped.
Similar technologies:
Publication number | Publication date | Patent title
EP3134234B1|2018-03-21|Method and facility for automatically gripping an object
EP3056288B1|2018-03-14|Selective sorting method and device
FR3032365A1|2016-08-12|SELECTIVE SORTING PROCEDURE
WO2016102820A1|2016-06-30|Facility for separating and individualising heterogeneous mail items
DE102009030461A1|2010-01-07|Device for recording objects
EP2552812A1|2013-02-06|Method and device for transferring cutouts for packaging boxes
FR2666315A1|1992-03-06|DEVICE FOR CONTROLLING AND REGULARIZING THE SPACING OF PARCELS, PACKAGES OR SIMILAR OBJECTS, ESPECIALLY POSTAL PARCELS.
JP6480489B2|2019-03-13|Adaptive gripper apparatus and method
EP1382714A3|2004-10-06|Apparatus for removing particles
WO2017093683A1|2017-06-08|Method and facility for composing a batch of parts from parts situated in different storage areas
CN108555902B|2021-05-25|Method and device for sorting articles by robot and robot
FR3063668A1|2018-09-14|CLIP-TYPE GRIPPING DEVICE AND SYSTEM COMPRISING SUCH DEVICES
EP2812161B1|2020-11-25|Unit and method for the automatic hooking of parts onto complex supports
FR3060538A1|2018-06-22|METHOD AND APPARATUS FOR SEPARATING A LINK SURROUNDING A LIASSE OF PRINTS FROM A PALLET
WO2019224202A1|2019-11-28|Method and system for unstacking a stack of wafers of semi-conductive material
WO2015118266A1|2015-08-13|Method and apparatus for depositing individual products on non-indexed, planar holders having recesses
US20190015873A1|2019-01-17|Methods and systems for sorting a plurality of components for directed self-assembly
EP3095565B1|2017-11-15|Method and facility for unhooking parts hanging on hooks
WO2019243674A1|2019-12-26|Apparatus and method for transferring, to a processing line, printed matter initially packaged as bundles
US20200144780A1|2020-05-07|Cable processing device
EP3772747A1|2021-02-10|Positioning tool
JP2019188561A|2019-10-31|Article gripping device and control device of article gripping device
FR3085612A1|2020-03-13|VERSATILE AND DEFORMABLE ELECTROMAGNETIC GRIPPING TOOL
FR2860775A1|2005-04-15|DEVICE FOR TRANSPORTING AND ALIGNING DISC-SHAPED ELEMENTS
WO2021084009A1|2021-05-06|Palletisation and depalletisation system and method
Patent family:
Publication number | Publication date
DK3134234T3|2018-04-30|
FR3020303B1|2016-07-15|
US20170050315A1|2017-02-23|
WO2015162390A1|2015-10-29|
JP2017513727A|2017-06-01|
PT3134234T|2018-05-25|
EP3134234B1|2018-03-21|
PL3134234T3|2018-08-31|
ES2666793T3|2018-05-07|
EP3134234A1|2017-03-01|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US20130211593A1|2010-11-17|2013-08-15|Mitsubishi Electric Corporation|Workpiece pick-up apparatus|
US20140031985A1|2012-07-26|2014-01-30|Fanuc Corporation|Apparatus and method of taking out bulk stored articles by robot|
US20140067127A1|2012-08-29|2014-03-06|Fanuc Corporation|Apparatus and method of taking out bulk stored articles by robot|WO2017093683A1|2015-12-03|2017-06-08|Sileane|Method and facility for composing a batch of parts from parts situated in different storage areas|
EP3263292A1|2016-06-28|2018-01-03|Tata Consultancy Services Limited|Adaptive gripper device|
CN112775967A|2020-12-30|2021-05-11|中南民族大学|Mechanical arm grabbing method, device and equipment based on machine vision|JPS63288684A|1987-05-21|1988-11-25|Aichi Machine Ind|Automatic correction means of clamping mistake in robot for carrying work|
JPH0629142U|1992-09-10|1994-04-15|コマツ電子金属株式会社|Wafer suction jig|
JPH09239682A|1996-03-06|1997-09-16|Nissan Motor Co Ltd|Work feeding method and device|
JP3930490B2|2004-04-23|2007-06-13|ファナック株式会社|Article take-out device|
US20060047421A1|2004-08-25|2006-03-02|Microsoft Corporation|Computing point-to-point shortest paths from external memory|
JP5304469B2|2009-06-19|2013-10-02|株式会社デンソーウェーブ|Bin picking system|
FR2987685A1|2012-03-02|2013-09-06|Akeo Plus|Method for controlling robot for displacement of object i.e. nut, placed on support in three-dimensional space, involves researching position and spatial three dimensional orientation of each object, and carrying out displacement of object|
US8958912B2|2012-06-21|2015-02-17|Rethink Robotics, Inc.|Training and operating industrial robots|
US9233470B1|2013-03-15|2016-01-12|Industrial Perception, Inc.|Determining a virtual representation of an environment by projecting texture patterns|
FR3032365B1|2015-02-10|2017-02-03|Veolia Environnement-VE|SELECTIVE SORTING PROCEDURE|EP3341166A1|2015-08-26|2018-07-04|Berkshire Grey Inc.|Systems and methods for providing vacuum valve assemblies for end effectors|
EP3341163A1|2015-08-26|2018-07-04|Berkshire Grey Inc.|Systems and methods for providing contact detection in an articulated arm|
ES2845683T3|2015-09-01|2021-07-27|Berkshire Grey Inc|Systems and Methods for Providing Dynamic Robotic Control Systems|
CN108290297B|2015-09-08|2021-12-03|伯克希尔格雷股份有限公司|System and method for providing high flow vacuum acquisition in an automated system|
EP3347175B1|2015-09-09|2022-03-02|Berkshire Grey, Inc.|Systems and methods for providing dynamic communicative lighting in a robotic environment|
CA2998544A1|2015-09-11|2017-03-16|Berkshire Grey, Inc.|Robotic systems and methods for identifying and processing a variety of objects|
US10625432B2|2015-11-13|2020-04-21|Berkshire Grey, Inc.|Processing systems and methods for providing processing of a variety of objects|
US9937532B2|2015-12-18|2018-04-10|Berkshire Grey Inc.|Perception systems and methods for identifying and processing a variety of objects|
CN113478512A|2016-01-08|2021-10-08|伯克希尔格雷股份有限公司|System and method for acquiring and moving objects|
CA3014049C|2016-02-08|2021-06-22|Thomas Wagner|Systems and methods for providing processing of a variety of objects employing motion planning|
CN109983433A|2016-07-18|2019-07-05|L·奥德纳|Evaluate robot crawl|
CN106421946A|2016-10-11|2017-02-22|广西大学|Multi-degree-of-freedom automated cupping device|
CN110199231A|2016-11-08|2019-09-03|伯克希尔格雷股份有限公司|System and method for handling object|
EP3544911A1|2016-11-28|2019-10-02|Berkshire Grey Inc.|Systems and method for providing singulation of objects for processing|
CA3045522A1|2016-12-06|2018-06-14|Berkshire Grey, Inc.|Systems and methods for providing for the processing of objects in vehicles|
CA3139255A1|2016-12-09|2018-06-14|Thomas Wagner|Systems and methods for processing objects provided in vehicles|
US10639787B2|2017-03-06|2020-05-05|Berkshire Grey, Inc.|Systems and methods for efficiently moving a variety of objects|
CN110770149A|2017-04-24|2020-02-07|伯克希尔格雷股份有限公司|System and method for providing separation of objects for processing using object movement redistribution|
CN110958932A|2017-08-02|2020-04-03|伯克希尔格雷股份有限公司|System and method for acquiring and moving objects having complex exterior surfaces|
CN111315546A|2017-11-07|2020-06-19|伯克希尔格雷股份有限公司|System and method for providing dynamic vacuum pressure at an end effector using a single vacuum source|
JP2019126886A|2018-01-25|2019-08-01|株式会社リコー|Information processing system, picking section specifying method, and program|
CN112384342A|2018-08-06|2021-02-19|株式会社岛津制作所|Conveying device|
CN109483534A|2018-11-08|2019-03-19|腾讯科技(深圳)有限公司|A kind of grasping body methods, devices and systems|
KR102218246B1|2020-06-30|2021-02-22|파워오토메이션 주식회사|Gripping System for Hybrid Multi Insertion Robot Machine|
Legal status:
2015-04-30| PLFP| Fee payment|Year of fee payment: 2 |
2015-10-30| PLSC| Search report ready|Effective date: 20151030 |
2016-04-29| PLFP| Fee payment|Year of fee payment: 3 |
2017-04-27| PLFP| Fee payment|Year of fee payment: 4 |
2018-05-02| PLFP| Fee payment|Year of fee payment: 5 |
2020-01-10| ST| Notification of lapse|Effective date: 20191205 |
Priority:
Application number | Filing date | Patent title
FR1453725A|FR3020303B1|2014-04-25|2014-04-25|METHOD AND INSTALLATION FOR AUTOMATIC PRETENSION OF AN OBJECT.|FR1453725A| FR3020303B1|2014-04-25|2014-04-25|METHOD AND INSTALLATION FOR AUTOMATIC PRETENSION OF AN OBJECT.|
DK15725792.4T| DK3134234T3|2014-04-25|2015-04-23|Procedure and installation to automatically grab an object|
ES15725792.4T| ES2666793T3|2014-04-25|2015-04-23|Procedure and installation of automatic clamping of an object|
US15/306,715| US20170050315A1|2014-04-25|2015-04-23|Method And Facility For Automatically Gripping An Object|
EP15725792.4A| EP3134234B1|2014-04-25|2015-04-23|Method and facility for automatically gripping an object|
PT157257924T| PT3134234T|2014-04-25|2015-04-23|Method and facility for automatically gripping an object|
PL15725792T| PL3134234T3|2014-04-25|2015-04-23|Method and facility for automatically gripping an object|
JP2017507087A| JP2017513727A|2014-04-25|2015-04-23|Automatic gripping method and equipment for target|
PCT/FR2015/051113| WO2015162390A1|2014-04-25|2015-04-23|Method and facility for automatically gripping an object|